Proceedings of the Workshop on Change of Representation and Problem Reformulation
The proceedings of the third Workshop on Change of Representation and Problem Reformulation are presented. In contrast to the first two workshops, this workshop focused on analytic or knowledge-based approaches, as opposed to the statistical or empirical approaches called 'constructive induction'. The organizing committee believes there is potential for combining analytic and inductive approaches at a future date. However, it became apparent at the previous two workshops that the communities pursuing these different approaches are currently interested in largely non-overlapping issues. The constructive induction community has been holding its own workshops, principally in conjunction with the machine learning conference. While this workshop is more focused on analytic approaches, the organizing committee has made an effort to include more application domains, expanding greatly from the workshop's origins in the machine learning community. Participants come from the full spectrum of AI application domains, including planning, qualitative physics, software engineering, knowledge representation, and machine learning.
Symmetry as Bias: Rediscovering Special Relativity
This paper describes a rational reconstruction of Einstein's discovery of special relativity, validated through an implementation: the Erlanger program. The discovery of special relativity revolutionized both the content of physics and the research strategy used by theoretical physicists. This research strategy entails a mutual bootstrapping process between a hypothesis space for biases, defined through different postulated symmetries of the universe, and a hypothesis space for physical theories. The invariance principle mutually constrains these two spaces: it enables detecting when an evolving physical theory becomes inconsistent with its bias, and also when the biases for theories describing different phenomena are inconsistent. Structural properties of the invariance principle facilitate generating a new bias when an inconsistency is detected. After a new bias is generated, this principle facilitates reformulating the old, inconsistent theory by treating the latter as a limiting approximation. The structural properties of the invariance principle can be suitably generalized to other types of biases to enable primal-dual learning.
Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design
The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (as opposed to general-purpose) software design. General issues that cut across particular software design domains include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interfaces.
Generation and Exploitation of Aggregation Abstractions for Scheduling and Resource Allocation
Our research investigates abstraction of computational theories for scheduling and resource allocation. These theories are represented in a variant of first-order predicate calculus, parameterized multisorted logic, that facilitates specification of large problems. A particular problem is conceptually stated as a set of ground sentences consistent with a quantified theory. We are mainly investigating the automated generation of aggregation abstractions and approximations, in which detailed resource allocation constraints are replaced by constraints between aggregate demand and capacity. We are also investigating the interaction of aggregation abstractions with the more thoroughly investigated abstraction of weakening operator preconditions. The purpose of the theories for aggregated demand/capacity is threefold: first, to answer queries about aggregate properties, such as gross feasibility; second, to reduce computational costs by using the solution of aggregate problems to guide the solution of detailed problems; and third, to facilitate reformulating theories to approximate problems for which efficient problem-solving methods exist. We also describe novel methods for exploiting aggregation abstractions.
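The aggregate demand/capacity idea described in this abstract can be sketched in a few lines. This is a minimal illustration with assumed names and a toy greedy detailed solver, not the paper's parameterized multisorted logic: the aggregate comparison is a necessary (not sufficient) condition for feasibility, so it can answer gross-feasibility queries and prune detailed search cheaply.

```python
# Illustrative sketch of an aggregation abstraction (assumed names, not
# the paper's formalism): detailed per-slot allocation constraints are
# replaced by a single comparison of aggregate demand against capacity.

def aggregate_feasible(demands, capacity_per_slot, num_slots):
    """Necessary condition for any detailed schedule to exist:
    total demand must not exceed total capacity over the horizon."""
    return sum(demands) <= capacity_per_slot * num_slots

def detailed_schedule(demands, capacity_per_slot, num_slots):
    """Greedy first-fit at the detailed level, attempted only after the
    cheap aggregate check passes (the aggregate level guides the
    detailed level)."""
    if not aggregate_feasible(demands, capacity_per_slot, num_slots):
        return None  # gross infeasibility detected without detailed search
    slots = [0] * num_slots
    assignment = {}
    for task, demand in enumerate(demands):
        for s in range(num_slots):
            if slots[s] + demand <= capacity_per_slot:
                slots[s] += demand
                assignment[task] = s
                break
        else:
            return None  # greedy placement failed at the detailed level
    return assignment
```

The aggregate test mirrors the first two purposes stated above: it answers an aggregate query (gross feasibility) directly, and it spares the detailed solver from obviously infeasible instances.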
Fabrication of Micropatterned Hydrogels for Neural Culture Systems using Dynamic Mask Projection Photolithography
Patterned cell culture environments are increasingly relevant for studying cellular characteristics, and many researchers see a need for 3D environments so that in vitro experiments better mimic in vivo conditions [1-3]. Studies in fields such as cancer research [4], neural engineering [5], cardiac physiology [6], and cell-matrix interaction [7,8] have shown that cell behavior differs substantially between traditional monolayer cultures and 3D constructs.
Fast, Interactive Worst-Case Execution Time Analysis With Back-Annotation
Abstract—For hard real-time systems, static code analysis is needed to derive a safe bound on the worst-case execution time (WCET). Virtually all prior work has focused on the accuracy of WCET analysis without regard to the speed of analysis. The resulting algorithms are often too slow to be integrated into the development cycle, requiring WCET analysis to be postponed until a final verification phase. In this paper we propose interactive WCET analysis as a new method to provide near-instantaneous WCET feedback to the developer during software programming. We show that interactive WCET analysis is feasible using tree-based WCET calculation. The feedback is realized with a plugin for the Java editor jEdit, where the WCET values are back-annotated to the Java source at the statement level. Comparison of this tree-based approach with the implicit path enumeration technique (IPET) shows that tree-based analysis scales better with respect to program size and gives similar WCET values. Index Terms—Real-time systems, performance analysis, software performance, software reliability, software algorithms, safety
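Tree-based WCET calculation, as named in this abstract, can be sketched with a small recursion over a syntax tree. The node encoding below is an assumption for illustration, not the paper's implementation: a sequence contributes the sum of its parts, a branch contributes its condition cost plus the costlier alternative, and a loop multiplies its body's bound by a loop bound.

```python
# Minimal sketch of tree-based WCET calculation over an assumed AST
# encoding (tuples), not the paper's Java/jEdit implementation.

def wcet(node):
    kind = node[0]
    if kind == "basic":    # ("basic", cycles): straight-line block cost
        return node[1]
    if kind == "seq":      # ("seq", [children]): sum of parts
        return sum(wcet(child) for child in node[1])
    if kind == "branch":   # ("branch", cond_cycles, then_node, else_node)
        return node[1] + max(wcet(node[2]), wcet(node[3]))
    if kind == "loop":     # ("loop", bound, body): bound * body cost
        return node[1] * wcet(node[2])
    raise ValueError(f"unknown node kind: {kind}")

# Example program: a block, then a 10-iteration loop whose body branches.
prog = ("seq", [("basic", 5),
                ("loop", 10, ("branch", 2, ("basic", 8), ("basic", 3))),
                ("basic", 4)])
```

Because each node's bound depends only on its children, such a calculation can be recomputed locally and near-instantaneously as the source changes, which is what makes the interactive, back-annotated feedback loop plausible.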
Model-based System Health Management and Contingency Planning for Autonomous UAS
Safe autonomous operation of an Unmanned Aerial System (UAS) requires that the UAS can react to unforeseen circumstances, for example, after a failure has occurred. In this paper we describe a model-based run-time architecture for autonomous on-board diagnosis, system health management, and contingency management. This architecture is being instantiated on top of NASA's Core Flight System (cFS/cFE) as a major component of the on-board Autonomous Operating System (AOS). We describe our diagnosis and monitoring components, which continuously provide system health status. Automated reasoning with constraint satisfaction forms the core of our decision-making component, which assesses the current situation, aids in failure disambiguation, and constructs a contingency plan to mitigate the failure(s) and allow for a safe end of the mission. We illustrate our contingency management system with two case studies: one for a fixed-wing aircraft in simulation, and one for an autonomous DJI S1000+ octocopter.
Scientific Value of Real-Time Global Positioning System Data
The Global Positioning System (GPS) is an example of a Global Navigation Satellite System (GNSS) that provides an essential complement to other geophysical networks because of its high precision, sensitivity to the longest‐period bands, ease of deployment, and ability to measure displacement and atmospheric properties over local to global scales. Recent and ongoing technical advances, combined with decreasing equipment and data acquisition costs, portend rapid increases in accessibility of data from expanding global geodetic networks. Scientists and the public are beginning to have access to these high‐rate, continuous data streams and event‐specific information within seconds to minutes rather than days to months. These data provide the opportunity to observe Earth system processes with greater accuracy and detail, as they occur
Ensemble Properties of Comets in the Sloan Digital Sky Survey
We present the ensemble properties of 31 comets (27 resolved and 4 unresolved) observed by the Sloan Digital Sky Survey (SDSS). This sample of comets represents about 1 comet per 10 million SDSS photometric objects. Five-band (u, g, r, i, z) photometry is used to determine the comets' colors, sizes, surface brightness profiles, and rates of dust production in terms of the Afρ formalism. We find that the cumulative luminosity function for the Jupiter Family Comets in our sample is well fit by a power law of the form N(<H) ∝ 10^((0.49 ± 0.05)H) for H < 18, with evidence of a much shallower fit N(<H) ∝ 10^((0.19 ± 0.03)H) for the faint (14.5 < H < 18) comets. The resolved comets show an extremely narrow distribution of colors (0.57 ± 0.05 in g − r, for example), which is statistically indistinguishable from that of the Jupiter Trojans. Further, there is no evidence of correlation between color and physical, dynamical, or observational parameters for the observed comets. Comment: 19 pages, 8 tables, 11 figures; to appear in Icarus.
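A quick way to read the quoted power-law slopes is as a growth factor per magnitude. The helper below is purely illustrative, using only the slope values quoted in the abstract: for N(<H) ∝ 10^(aH), increasing H by one magnitude multiplies the cumulative count by 10^a.

```python
# Worked reading of the quoted luminosity function N(<H) ∝ 10^(a·H):
# the slope a fixes the factor by which cumulative counts grow per
# magnitude of H. The slopes below are the fits quoted in the abstract.

def count_growth_per_magnitude(slope):
    """Multiplicative increase in N(<H) when H increases by 1."""
    return 10 ** slope

bright_end = count_growth_per_magnitude(0.49)  # roughly 3.1x per magnitude
faint_end = count_growth_per_magnitude(0.19)   # roughly 1.5x per magnitude
```

The shallower faint-end slope thus corresponds to cumulative counts growing by only about 1.5x per magnitude beyond H ≈ 14.5, versus about 3.1x at the bright end.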
Searching for stochastic gravitational-wave background with the co-located LIGO interferometers
This paper presents techniques developed by the LIGO Scientific Collaboration to search for the stochastic gravitational-wave background using the co-located pair of LIGO interferometers at Hanford, WA. We use correlations between interferometers and environment monitoring instruments, as well as time-shifts between the two interferometers (described here for the first time), to identify correlated noise from non-gravitational sources. We veto particularly noisy frequency bands and assess the level of residual non-gravitational coupling that exists in the surviving data. Comment: Proceedings paper from the 7th Edoardo Amaldi Conference on Gravitational Waves, held in Sydney, Australia, 8-14 July 2007. Accepted to J. Phys.: Conf. Ser.
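The time-shift idea in this abstract can be illustrated with a toy example. This is not LIGO's analysis pipeline; the signal model and numbers are assumptions for illustration: a broadband component common to both streams (standing in for a genuine stochastic background) correlates only at zero lag, while a persistent narrowband noise line survives a time shift, exposing it as non-gravitational.

```python
# Toy illustration (not LIGO's pipeline) of identifying correlated
# noise with time shifts: shifting one stream destroys broadband
# correlation but leaves a periodic instrumental line correlated.
import math
import random

random.seed(0)
N, SHIFT = 4096, 512  # SHIFT chosen as a multiple of the line's period

common = [random.gauss(0.0, 1.0) for _ in range(N + SHIFT)]       # broadband
line = [math.sin(2 * math.pi * k / 8) for k in range(N + SHIFT)]  # noise line

det1 = [common[k] + line[k] for k in range(N)]
det2 = [common[k] + line[k] for k in range(N)]
det2_shifted = [common[k + SHIFT] + line[k + SHIFT] for k in range(N)]

def corr(a, b):
    """Normalized zero-lag cross-correlation of two streams."""
    num = sum(x * y for x, y in zip(a, b))
    return num / math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))

zero_lag = corr(det1, det2)          # near 1: broadband and line correlate
shifted = corr(det1, det2_shifted)   # only the periodic line survives
```

After the shift, the broadband part decorrelates while the 8-sample-period line realigns with itself, so any correlation remaining in time-shifted data flags a non-gravitational source.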